Multi-task learning on nuclear masses and separation energies with the kernel ridge regression

Authors

Abstract

A multi-task learning (MTL) framework for nuclear masses and separation energies, called gradient kernel ridge regression, is developed by introducing gradient kernel functions into the kernel ridge regression (KRR) approach. Taking the WS4 mass model as an example, the KRR network trained with the residuals, i.e., the deviations between experimental and theoretical values of nuclear masses and one-nucleon separation energies, improves the accuracy of the predictions. Significant improvements are achieved by the present approach in both interpolation and extrapolation predictions of masses and separation energies. This demonstrates the advantage of the present MTL framework, which integrates information from different observables and thereby improves the predictions of each of them.
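As a rough illustration of the base technique only, the following is a minimal sketch of standard KRR trained on mass-model residuals in the (Z, N) plane; the gradient-kernel MTL extension of the paper is not reproduced here, and the training data below are hypothetical placeholders, not actual WS4 residuals.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=2.0):
    """Pairwise Gaussian kernel between two sets of (Z, N) points."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, lam=1e-3, sigma=2.0):
    """Solve (K + lam * I) alpha = y for the KRR weights."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, sigma=2.0):
    """Predicted residual correction at new nuclei."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Hypothetical residuals dM = M_exp - M_theory at a few (Z, N) points (MeV).
X = np.array([[20, 20], [20, 22], [28, 28], [28, 30]], dtype=float)
y = np.array([0.3, -0.1, 0.2, 0.05])

alpha = krr_fit(X, y)
# Correction interpolated at an unmeasured nucleus.
correction = krr_predict(X, alpha, np.array([[24.0, 25.0]]))
```

The learned correction is then added back onto the mass model's prediction, so the network only has to capture the (smoother) residual surface rather than the full mass landscape.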


Similar resources

Comparison between multi-task and single-task oracle risks in kernel ridge regression

In this paper we study multi-task kernel ridge regression and try to understand when the multi-task procedure performs better than the single-task one, in terms of averaged quadratic risk. In order to do so, we compare the risks of the estimators with perfect calibration, the oracle risk. We are able to give explicit settings, favorable to the multi-task procedure, where the multi-task oracle p...


Distributed Semi-supervised Learning with Kernel Ridge Regression

This paper provides error analysis for distributed semi-supervised learning with kernel ridge regression (DSKRR) based on a divide-and-conquer strategy. DSKRR applies kernel ridge regression (KRR) to data subsets that are distributively stored on multiple servers to produce individual output functions, and then takes a weighted average of the individual output functions as a final estimator. Us...
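The divide-and-conquer strategy described above can be sketched in a few lines: fit KRR independently on disjoint data subsets ("servers") and average the local predictors. The target function, sample sizes, and equal weighting below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def krr(X, y, Xq, lam=1e-2, sigma=0.1):
    """Local KRR estimator: fit on (X, y), evaluate at query points Xq."""
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    Kq = np.exp(-((Xq[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    return Kq @ alpha

# Synthetic regression data, split across m "servers".
X = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=200)
Xq = np.linspace(0, 1, 50)

m = 4
parts = np.array_split(rng.permutation(200), m)
# Final estimator: (equally) weighted average of the local output functions.
f_bar = np.mean([krr(X[p], y[p], Xq) for p in parts], axis=0)
```

Each server only ever sees its own subset, so the per-server cost of solving the kernel system drops from O(n^3) to O((n/m)^3), while averaging damps the variance of the individual estimators.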


Estimating Predictive Variances with Kernel Ridge Regression

In many regression tasks, in addition to an accurate estimate of the conditional mean of the target distribution, an indication of the predictive uncertainty is also required. There are two principal sources of this uncertainty: the noise process contaminating the data and the uncertainty in estimating the model parameters based on a limited sample of training data. Both of them can be summaris...


Multi-task Multiple Kernel Learning

This paper presents two novel formulations for learning shared feature representations across multiple tasks. The idea is to pose the problem as that of learning a shared kernel, which is constructed from a given set of base kernels, leading to improved generalization in all the tasks. The first formulation employs a (l1, lp), p ≥ 2 mixed norm regularizer promoting sparse combinations of the ba...



Journal

Journal title: Physics Letters B

Year: 2022

ISSN: 0370-2693, 1873-2445

DOI: https://doi.org/10.1016/j.physletb.2022.137394